
    Weiss–Weinstein Bound for Data-Aided Carrier Estimation

    This letter investigates Bayesian bounds on the mean-square error (MSE) applied to a data-aided carrier estimation problem. The presented bounds are derived from a covariance inequality principle: the so-called Weiss and Weinstein family. These bounds are of utmost interest for finding the fundamental MSE limits of an estimator, even in critical scenarios (low signal-to-noise ratio and/or a small number of observations). For the data-aided carrier estimation problem, a closed-form expression of the Weiss–Weinstein bound (WWB), known to be the tightest bound of the Weiss and Weinstein family, is given, together with a comparison against the maximum likelihood estimator and the other bounds of the family. The WWB is shown to be an efficient tool for approximating this estimator's MSE and predicting the well-known threshold effect.
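
    As an illustration of the threshold effect discussed above, here is a minimal Monte Carlo sketch (not taken from the letter; the pilot sequence, SNR values, and sample sizes are assumptions) comparing the MSE of the data-aided ML phase estimator with the Cramér-Rao bound. At high SNR the two agree; at very low SNR the wrapped phase error saturates near pi^2/3 while the CRB keeps growing, which is the regime where tighter Bayesian bounds such as the WWB remain informative.

```python
import numpy as np

rng = np.random.default_rng(0)

def ml_phase_mse(snr_db, T=20, trials=5000, theta=0.4):
    """Monte Carlo MSE of the data-aided ML carrier-phase estimator
    theta_hat = angle(sum_t conj(a_t) * y_t), with y_t = a_t*exp(j*theta) + n_t."""
    snr = 10.0 ** (snr_db / 10.0)
    a = np.ones(T)  # known unit-modulus pilot symbols (an assumption)
    n = rng.standard_normal((trials, T)) + 1j * rng.standard_normal((trials, T))
    n *= np.sqrt(1.0 / (2.0 * snr))                    # E|n_t|^2 = 1/snr
    y = a * np.exp(1j * theta) + n
    theta_hat = np.angle((np.conj(a) * y).sum(axis=1))
    err = np.angle(np.exp(1j * (theta_hat - theta)))   # wrapped phase error
    return float(np.mean(err ** 2))

def crb_phase(snr_db, T=20):
    """Cramer-Rao bound for the phase: 1 / (2 * T * SNR)."""
    return 1.0 / (2.0 * T * 10.0 ** (snr_db / 10.0))

for snr_db in (10.0, -30.0):
    print(snr_db, "MSE/CRB =", ml_phase_mse(snr_db) / crb_phase(snr_db))
```

    At 10 dB the ratio is close to 1 (the estimator is essentially efficient); at -30 dB the MSE saturates well below the diverging CRB, so the CRB no longer describes the achievable accuracy.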

    Contributions to lower bounds on the mean square error in signal processing

    Using lower bounds on the mean square error, we study the threshold (breakdown) behavior of estimators, the optimal placement of sensors in an array, and statistical resolution limits, in the context of array processing and radar.

    Lower bounds on the mean square error derived from mixtures of linear and non-linear transformations of the unbiasedness definition

    It is well known that in non-linear estimation problems the ML estimator exhibits a threshold effect, i.e. a rapid deterioration of estimation accuracy below a certain SNR or number of snapshots. This effect is caused by outliers and is not captured by standard tools such as the Cramér-Rao bound (CRB). The search for the SNR threshold value can be carried out with the help of approximations of the Barankin bound (BB) proposed by many authors. These approximations result from a linear transformation (discrete or integral) of the uniform unbiasedness constraint introduced by Barankin. Nevertheless, non-linear transformations can be used as well for some classes of p.d.f., including the Gaussian case. The benefit is their combination with existing linear transformations to obtain tighter lower bounds that improve the SNR threshold prediction.
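
    The simplest bound obtained from a discrete (single test point) transformation of the unbiasedness constraint is the Hammersley-Chapman-Robbins (HCR) bound. A minimal sketch for one scalar observation x ~ N(theta, sigma2), a toy model chosen here purely for illustration:

```python
import numpy as np

def hcr_bound(h, sigma2):
    """Hammersley-Chapman-Robbins bound for one observation x ~ N(theta, sigma2):
    MSE >= h^2 / (E[L^2] - 1), where L = p(x; theta + h) / p(x; theta) and,
    for this Gaussian model, E[L^2] = exp(h^2 / sigma2)."""
    return h ** 2 / np.expm1(h ** 2 / sigma2)

sigma2 = 1.0
hs = np.logspace(-3, 1, 200)          # grid of test points (an assumption)
print(hcr_bound(hs, sigma2).max())    # approaches the CRB sigma2 as h -> 0
```

    For this linear-Gaussian toy problem the supremum over h simply recovers the CRB. It is in non-linear problems, where distant test points capture ambiguities of the likelihood, that such bounds rise above the CRB and predict the SNR threshold.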

    A NEW DERIVATION OF THE BAYESIAN BOUNDS FOR PARAMETER ESTIMATION

    This paper deals with minimal bounds in the Bayesian context. We express the minimum mean square error of the conditional mean estimator as the solution of a constrained optimization problem with a continuum of constraints. By relaxing these constraints, we obtain the bounds of the Weiss-Weinstein family. Moreover, this method enables us to derive new bounds, such as the Bayesian version of the deterministic Abel bound.
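
    The starting point above, the MMSE of the conditional-mean estimator, can be checked on the simplest linear-Gaussian toy model (the prior and noise variances below are assumptions), for which the conditional mean is linear and the Bayesian Cramér-Rao bound coincides with the MMSE:

```python
import numpy as np

rng = np.random.default_rng(1)

var_theta, var_n = 1.0, 0.5      # prior and noise variances (assumptions)
N = 200_000
theta = rng.normal(0.0, np.sqrt(var_theta), N)    # theta ~ N(0, var_theta)
x = theta + rng.normal(0.0, np.sqrt(var_n), N)    # observation x = theta + n

# Conditional-mean (MMSE) estimator for this linear-Gaussian model
gain = var_theta / (var_theta + var_n)
theta_hat = gain * x

mc_mse = float(np.mean((theta_hat - theta) ** 2))
analytic = var_theta * var_n / (var_theta + var_n)  # MMSE = Bayesian CRB here
print(mc_mse, analytic)
```

    The Monte Carlo MSE matches the analytic MMSE, and both are below var_n, the MSE of the naive estimator that returns x itself.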

    NON ASYMPTOTIC EFFICIENCY OF A MAXIMUM LIKELIHOOD ESTIMATOR AT FINITE NUMBER OF SAMPLES

    In estimation theory, the asymptotic (in the number of samples) efficiency of the Maximum Likelihood (ML) estimator is a well-known result [1]. Nevertheless, in some scenarios the number of snapshots may be small. We recently investigated the asymptotic behavior of the Stochastic ML (SML) estimator at high Signal-to-Noise Ratio (SNR) and a finite number of samples [2] in the array processing framework: we proved the non-Gaussianity of the SML estimator and obtained an analytical expression of its variance in the single-source case. In this paper, we generalize these results to multiple sources and obtain variance expressions which demonstrate the non-efficiency of SML estimates.
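
    The following is not the stochastic ML array-processing setting of the paper, but the failure of large-sample asymptotics at small T is easy to reproduce on a scalar toy problem (an assumption made for illustration): the ML estimate of a Gaussian variance follows a scaled chi-square law, hence is visibly skewed, i.e. non-Gaussian, for small T, with skewness sqrt(8/(T-1)) vanishing only as T grows.

```python
import numpy as np

rng = np.random.default_rng(2)

def ml_var_skewness(T, trials=20_000):
    """Monte Carlo skewness of the ML variance estimate from T N(0,1) samples.
    sigma2_hat = (1/T) * sum (x_t - xbar)^2 follows a scaled chi2_{T-1} law,
    whose skewness sqrt(8/(T-1)) is far from the Gaussian value 0 at small T."""
    x = rng.standard_normal((trials, T))
    s2 = x.var(axis=1)                   # ML estimate (divides by T)
    z = s2 - s2.mean()
    return float(np.mean(z ** 3) / np.mean(z ** 2) ** 1.5)

print(ml_var_skewness(5), ml_var_skewness(500))
```

    With T = 5 the skewness is near sqrt(8/4) ≈ 1.41; with T = 500 it is close to 0, consistent with the usual large-T Gaussianity.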

    MSE lower bounds for deterministic parameter estimation

    This paper presents a simple approach for deriving computable lower bounds on the MSE of deterministic parameter estimators, with a clear interpretation of the bounds. We also address the tightness of these lower bounds in comparison with the MSE of ML estimators, and their ability to predict the SNR threshold region. Last, as many practical estimation problems must be regarded as joint detection-estimation problems, we recall that estimation performance must be conditioned on detection performance.

    CRLB under K-distributed observation with parameterized mean

    A semi-closed-form expression of the Fisher information matrix in the context of K-distributed observations with a parameterized mean is given and related to the classical, i.e. Gaussian, case. The connection is made via a simple multiplicative factor, which depends only on the intrinsic parameters of the texture and the size of the observation vector. Finally, numerical simulations are provided to corroborate the theoretical analysis.
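
    The Gaussian baseline that the multiplicative factor above rescales is the classical mean-term Fisher information F(theta) = (dm/dtheta)^T C^{-1} (dm/dtheta). A small numerical sketch with an assumed toy sinusoidal mean (the K-distribution texture factor itself is not reproduced here):

```python
import numpy as np

def gaussian_fim(mean, theta, C, eps=1e-6):
    """Fisher information for x ~ N(mean(theta), C) with known covariance C:
    F = (dm/dtheta)^T C^{-1} (dm/dtheta), dm/dtheta by central finite difference."""
    dm = (mean(theta + eps) - mean(theta - eps)) / (2.0 * eps)
    return float(dm @ np.linalg.solve(C, dm))

t = np.arange(8.0)
mean = lambda th: np.cos(th * t)   # assumed parameterized mean (toy example)
C = 0.5 * np.eye(8)                # known covariance (an assumption)
theta0 = 1.0

fim = gaussian_fim(mean, theta0, C)
analytic = np.sum((t * np.sin(theta0 * t)) ** 2) / 0.5  # exact dm = -t*sin(theta*t)
print(fim, analytic)               # the CRB is 1 / fim
```

    Per the abstract, the K-distributed Fisher information equals this Gaussian expression multiplied by a scalar factor depending only on the texture parameters and the observation size.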

    Asymptotic non-efficiency and non-Gaussianity of a maximum likelihood estimator at high signal-to-noise ratio

    In estimation theory, for independent observations with identical probability densities, the asymptotic efficiency in the number T of observations of the Maximum Likelihood (ML) method is a well-known result that characterizes its performance when T is large. In some situations, however, the number of observations may be small and this result no longer applies. In the framework of array processing, with a stochastic model for the signals emitted by the sources, we address this gap when the Signal-to-Noise Ratio (SNR) is high. We show that, in this situation, the ML estimator is asymptotically (in SNR) non-efficient and non-Gaussian.
